Pii: S0893-6080(00)00081-2
Authors
Abstract
Recurrent Neural Networks are not as widely used as Feedforward Neural Networks. Training algorithms for Recurrent Neural Networks based on the error gradient are very unstable in their search for a minimum and require considerable computational time when the number of neurons is large. The problems surrounding the application of these methods have driven us to develop new training tools. In this paper, we present a Real-Coded Genetic Algorithm that uses operators appropriate for this encoding type to train Recurrent Neural Networks. We describe the algorithm and experimentally compare our Genetic Algorithm with the Real-Time Recurrent Learning algorithm on a fuzzy grammatical inference task. © 2001 Elsevier Science Ltd. All rights reserved.
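The abstract gives no pseudocode, so the sketch below is only an illustration of the general idea (not the authors' exact algorithm): evolve the real-valued weight vector of a tiny recurrent network with a real-coded GA, using BLX-α crossover and Gaussian mutation, instead of a gradient method such as RTRL. The toy task (output the previous input, a one-step memory) and all sizes and parameters are assumptions for the example.

```python
# Hypothetical sketch: real-coded GA training of a small fully recurrent
# network. Network sizes, GA parameters, and the toy task are illustrative.
import numpy as np

rng = np.random.default_rng(0)
N_IN, N_HID = 1, 4                        # input and hidden units (illustrative)
GENES = N_HID * (N_HID + N_IN + 1)        # recurrent + input weights + biases

def run_rnn(genes, inputs):
    """Run the RNN on a sequence; unit 0's activation is the output."""
    W = genes[:N_HID * N_HID].reshape(N_HID, N_HID)                # recurrent
    U = genes[N_HID * N_HID:N_HID * (N_HID + N_IN)].reshape(N_HID, N_IN)
    b = genes[N_HID * (N_HID + N_IN):]                             # biases
    h = np.zeros(N_HID)
    outs = []
    for x in inputs:
        h = np.tanh(W @ h + U @ np.atleast_1d(x) + b)
        outs.append(h[0])
    return np.array(outs)

def fitness(genes, inputs, targets):
    """Higher is better: negative mean squared error over the sequence."""
    return -np.mean((run_rnn(genes, inputs) - targets) ** 2)

def blx_alpha(p1, p2, alpha=0.5):
    """BLX-alpha crossover, a standard operator for real-coded GAs."""
    lo, hi = np.minimum(p1, p2), np.maximum(p1, p2)
    span = hi - lo
    return rng.uniform(lo - alpha * span, hi + alpha * span)

# Toy data: the target is the input delayed by one time step.
xs = rng.choice([-1.0, 1.0], size=40)
ys = np.concatenate(([0.0], xs[:-1]))

pop = rng.normal(0.0, 0.5, size=(30, GENES))       # initial population
for gen in range(200):
    fit = np.array([fitness(ind, xs, ys) for ind in pop])
    elite = pop[np.argsort(fit)[::-1][:10]]        # truncation selection
    children = [elite[0]]                          # elitism: keep the best
    while len(children) < len(pop):
        p1, p2 = elite[rng.integers(10)], elite[rng.integers(10)]
        child = blx_alpha(p1, p2)
        # Gaussian mutation applied to ~10% of the genes
        child = child + rng.normal(0.0, 0.05, GENES) * (rng.random(GENES) < 0.1)
        children.append(child)
    pop = np.array(children)

best = max(pop, key=lambda g: fitness(g, xs, ys))
print("final MSE:", -fitness(best, xs, ys))
```

Because fitness is evaluated only from whole-sequence behaviour, no gradient (and hence no unfolding in time) is needed, which is the motivation the abstract gives for replacing gradient-based training.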
Similar articles
Pii: S0893-6080(00)00062-9
This article gives an overview of the different functional brain imaging methods, the kinds of questions these methods try to address and some of the questions associated with functional neuroimaging data for which neural modeling must be employed to provide reasonable answers. © 2000 Published by Elsevier Science Ltd.
Full text
Image denoising using self-organizing map-based nonlinear independent component analysis
This paper proposes the use of self-organizing maps (SOMs) for the blind source separation (BSS) problem with nonlinearly mixed signals corrupted by multiplicative noise. After an overview of some signal denoising approaches, we introduce the generic independent component analysis (ICA) framework, followed by a survey of existing neural solutions for ICA and nonlinear ICA (NLICA). We then detail...
Full text
Assessing interactions among neuronal systems using functional neuroimaging
We show that new methods for measuring effective connectivity allow us to characterise the interactions between brain regions that underlie the complex interactions among different processing stages of functional architectures.
Full text
Pii: S0893-6080(00)00043-5
It is demonstrated that rotational invariance and reflection symmetry of image classifiers lead to a reduction in the number of free parameters in the classifier. When used in adaptive detectors, e.g. neural networks, this may be used to decrease the number of training samples necessary to learn a given classification task, or to improve generalization of the neural network. Notably, the symmet...
Full text
Best approximation by Heaviside perceptron networks
In Lp-spaces with p in [1, infinity) there exists a best approximation mapping to the set of functions computable by Heaviside perceptron networks with n hidden units; however, for p in (1, infinity) such a best approximation is not unique and cannot be continuous.
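Stated compactly (a paraphrase of the claim above, with $\mathcal{H}_n$ denoting the set of functions computable by Heaviside perceptron networks with $n$ hidden units):

\[
p \in [1,\infty) \;\Longrightarrow\; \forall f \in L^p \;\; \exists g^{*} \in \mathcal{H}_n : \; \|f - g^{*}\|_p = \inf_{g \in \mathcal{H}_n} \|f - g\|_p ,
\]

while for $p \in (1,\infty)$ the minimizer $g^{*}$ need not be unique, and no best-approximation map $f \mapsto g^{*}$ can be continuous.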
Full text